Category: MS Access
MS Access -> MySQL replication

Having read the articles here, it seems I'm not the only one with this problem: "replicate a local MS Access database (the database people work on) to a web MySQL database (the database web visitors can browse)" ...

We all know there is no built-in solution in either Access or MySQL ... so I developed one ... I had to add timestamp fields to each table to record when a record was modified/added/deleted, as well as a Status field indicating whether the record was in the process of being deleted/updated or had actually been deleted/updated/etc. ... my VB app then connects to MySQL and inserts/deletes data according to what's in the Access db ...
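For the curious, the apply step boils down to something like the following minimal sketch -- shown in PHP (with PDO) rather than the VB actually used, against a simplified "Albums" table with a hypothetical Status value per row; the rows are assumed to have already been read from the Access side:

<?php
// Minimal sketch of the "apply changes" loop: each row read from the Access
// side carries a Status flag saying what happened to it. Table and column
// names here are simplified placeholders, not the real schema.
$pdo = new PDO('mysql:host=example.com;dbname=cddb', 'user', 'pass',
               [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

$changes = [   // would come from the Access database in the real application
    ['id' => 12, 'title' => 'Abbey Road', 'status' => 'updated'],
    ['id' => 57, 'title' => 'Revolver',   'status' => 'added'],
    ['id' => 90, 'title' => null,         'status' => 'deleted'],
];

foreach ($changes as $row) {
    switch ($row['status']) {
        case 'added':
            $pdo->prepare('INSERT INTO Albums (album_id, title) VALUES (?, ?)')
                ->execute([$row['id'], $row['title']]);
            break;
        case 'updated':
            $pdo->prepare('UPDATE Albums SET title = ? WHERE album_id = ?')
                ->execute([$row['title'], $row['id']]);
            break;
        case 'deleted':
            $pdo->prepare('DELETE FROM Albums WHERE album_id = ?')
                ->execute([$row['id']]);
            break;
    }
}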

I can tell you this has been a nightmare ... managing primary key fields, as well as the Timestamp/Status fields, on two separate databases (of different types) across several linked tables is really hard ... and once it's all done, it's painfully slow ... all this work for that?? :(

So I think this is not the right solution ... it overcomplicates the database structure for poor results ...

I'm now thinking about another, more server-side solution, because I've noticed that queries run much faster on an offline MySQL server than on a web MySQL server ... which seems logical: no data has to travel over the internet ...

Also, I would drop the TimeStamp/Status fields for all tables except the main one ... my database is a CD database with tables like "Albums" linked to a "Tracks" table linked to a "Songs" table, etc. ... "Albums" is clearly the main table that all other tables link to, directly or indirectly ... So the idea would be to export the data per album to a user-defined file format ... send this file through FTP to the website and run a script (Perl, or preferably PHP since I've never done anything in Perl) that would do all the needed checks to delete/add records where needed ... e.g. "oh, we have a song title here, do we already have this song in the database? If so, link the current track to the existing song_id; if not, create a new song_id and then link it to the track" ...
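That check could look roughly like the sketch below (an illustration only -- table and column names such as Songs, Tracks, song_id and track_id are placeholders, not necessarily the real schema):

<?php
// Sketch of the "do we already have this song?" check: look the title up,
// create the song if it's missing, then point the track at it.
// All table/column/function names are placeholders.
function linkTrackToSong(PDO $pdo, int $trackId, string $songTitle): void
{
    // Is there already a song with this title?
    $stmt = $pdo->prepare('SELECT song_id FROM Songs WHERE title = ?');
    $stmt->execute([$songTitle]);
    $songId = $stmt->fetchColumn();

    if ($songId === false) {
        // No: create it and let MySQL assign the auto-increment song_id.
        $pdo->prepare('INSERT INTO Songs (title) VALUES (?)')
            ->execute([$songTitle]);
        $songId = (int) $pdo->lastInsertId();
    }

    // Link the current track to the existing or newly created song.
    $pdo->prepare('UPDATE Tracks SET song_id = ? WHERE track_id = ?')
        ->execute([$songId, $trackId]);
}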

Some advantages to this technique:
- there is no need to constantly send data back and forth to the server ... all the work is done on the server ... this should really speed things up ...
- there is no need to transfer the primary keys, so I can use auto-increment primary key fields on both databases and it doesn't matter if the same key value is used in both

Disadvantages:
- the offline database and web database are not exactly the same (since they don't have the same primary keys) so it's not true replication ...
- any change to the database structure will mean rewriting part of the script to handle the changes (this was not true with the previous method: all I had to do was browse all the tables, get the TimeStamp and Status, and send the data no matter what it represented)
- also, and this is my question to this board: I'm not sure PHP is fast/robust enough to do this without hurting server performance too much ...

What do you think ?

Have you considered MyODBC, available on the MySQL website?

You can link your Access database to the MySQL tables, and when updates or whatever are done, they go directly into the MySQL database. It's transparent to the people using Access; they think it's all being done on their computer.

Might be an option??

---John Holmes...

Well yes, I have considered using MyODBC, but I quickly rejected this option because I want to update my database offline and then send the modifications from time to time ... but thanks for the suggestion anyway ;)

MyODBC is indeed a good solution for linking VB to a MySQL database if you can cope with all its shortcomings ... personally, and I recommend this once again to all users here, I use a VB class that encapsulates the MySQL C API and transforms the queried records into ADO-compatible recordsets ... this class is available here:

http://www.planetsourcecode.com/xq/ASP/txtCodeId.22472/lngWId.1/qx/vb/scripts/ShowCode.htm

There are a few restrictions, but so far nothing I couldn't overcome.

To anyone interested: I've tested the method I describe above and it works great!

My database is a CD database ... it's on the web as a MySQL db, but I use a homemade program to manage it. Several people use this program. It keeps an MS Access copy of the MySQL database. There is no direct link like MyODBC, because I'm the only one authorised to modify the web database (and also because one purpose of my software is to let people browse the database offline, that is, without being connected). However, users can send me modifications to the database and I validate (or not) these modifications. Once they've been validated, I can send the data to the web database and users can download it if they want ...

The main table is "Albums"; it's linked to an "Artist" table and cross-linked to a "Publishers" table (allowing multiple publishers per album). The "Albums" table is also linked to a "Tracks" table. The "Tracks" table is linked to a "Songs" table (itself linked to a "Composer" table) and to a "Song version" table (the structure of the versions and related tables is a little more complex because some song versions are inherited from others, so there's a hierarchical structure in there) ...
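For readers trying to picture that structure, here is one possible way it could be laid out in MySQL, created from a small PHP setup script; every table and column name below is a guess for illustration only (not the actual schema), and the version hierarchy is reduced to a single parent_version_id column:

<?php
// Illustrative schema only: a guess at the structure described above.
$pdo = new PDO('mysql:host=localhost;dbname=cddb', 'user', 'pass',
               [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

$pdo->exec('CREATE TABLE IF NOT EXISTS Artists (
                artist_id INT AUTO_INCREMENT PRIMARY KEY,
                name      VARCHAR(255) NOT NULL)');

$pdo->exec('CREATE TABLE IF NOT EXISTS Publishers (
                publisher_id INT AUTO_INCREMENT PRIMARY KEY,
                name         VARCHAR(255) NOT NULL)');

$pdo->exec('CREATE TABLE IF NOT EXISTS Albums (
                album_id  INT AUTO_INCREMENT PRIMARY KEY,
                artist_id INT NOT NULL,
                title     VARCHAR(255) NOT NULL)');

// Cross-link table so one album can have several publishers.
$pdo->exec('CREATE TABLE IF NOT EXISTS AlbumPublishers (
                album_id     INT NOT NULL,
                publisher_id INT NOT NULL,
                PRIMARY KEY (album_id, publisher_id))');

$pdo->exec('CREATE TABLE IF NOT EXISTS Songs (
                song_id     INT AUTO_INCREMENT PRIMARY KEY,
                composer_id INT NULL,
                title       VARCHAR(255) NOT NULL)');

// Song versions can inherit from one another, hence the self-reference.
$pdo->exec('CREATE TABLE IF NOT EXISTS SongVersions (
                version_id        INT AUTO_INCREMENT PRIMARY KEY,
                song_id           INT NOT NULL,
                parent_version_id INT NULL)');

$pdo->exec('CREATE TABLE IF NOT EXISTS Tracks (
                track_id   INT AUTO_INCREMENT PRIMARY KEY,
                album_id   INT NOT NULL,
                song_id    INT NOT NULL,
                version_id INT NULL,
                position   INT NOT NULL)');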

... as you can see, all tables link directly or indirectly to "Albums" ... knowing that, I can export all the data for one album into a file (I've done this using a proprietary file format, but it should work just as well as a regular SQL export) ... this file is uploaded to the web server ... I then execute a PHP script ...

The PHP script first deletes all the data related to the album from the MySQL database (without going too much into detail, this is not a simple "delete" across several tables, as I must check whether the info is used elsewhere in the database -- e.g. the script will not delete the entry "Beatles" from the "Artist" table if another CD in the database uses this entry -- of course this is even more complicated with inherited song versions, but that's another story) ...
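That "is this entry still used elsewhere?" check amounts to something like the sketch below (placeholder names again; deleteArtistIfUnused and the columns are illustrative, not the actual code):

<?php
// Sketch of the reference check done before removing a shared row: the
// artist is only deleted if no remaining album still points at it.
// Table, column and function names are placeholders.
function deleteArtistIfUnused(PDO $pdo, int $artistId): void
{
    // The album being replaced is assumed to have had its rows deleted already.
    $stmt = $pdo->prepare('SELECT COUNT(*) FROM Albums WHERE artist_id = ?');
    $stmt->execute([$artistId]);

    if ((int) $stmt->fetchColumn() === 0) {
        // No other CD uses this artist (e.g. "Beatles"), so it can go.
        $pdo->prepare('DELETE FROM Artists WHERE artist_id = ?')
            ->execute([$artistId]);
    }
}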

The delete process (including all checks) usually takes less than one second per album...

Then the script opens the uploaded file and inserts all the data into the database ... this takes less than one second too ...
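Put together, one per-album run looks roughly like the sketch below; purely for illustration the uploaded file is assumed to be a plain SQL export (one INSERT per line) rather than the proprietary format, and the transaction is an extra safeguard added in this sketch, not something described above:

<?php
// Sketch of one per-album import run: wipe the album's old rows, then
// re-insert everything from the uploaded file. File name, schema and the
// use of a transaction are illustrative assumptions.
$pdo = new PDO('mysql:host=localhost;dbname=cddb', 'user', 'pass',
               [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

$albumId = 42;

$pdo->beginTransaction();
try {
    // 1. Delete the old copy of the album (tracks first, then the album row);
    //    shared rows like artists are only removed when nothing else uses them.
    $pdo->prepare('DELETE FROM Tracks WHERE album_id = ?')->execute([$albumId]);
    $pdo->prepare('DELETE FROM Albums WHERE album_id = ?')->execute([$albumId]);

    // 2. Re-insert the album from the uploaded file, one statement per line.
    foreach (file('album_42.sql', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $sql) {
        $pdo->exec($sql);
    }

    $pdo->commit();
} catch (Exception $e) {
    $pdo->rollBack();   // leave the web database untouched if anything fails
    throw $e;
}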

All in all, this method is approx. 50 times faster (100 files were imported in 2 min 20 sec) than using a (homemade, I admit) replication-like algorithm, because such an algorithm has to communicate over the internet to check data on the web server ... that's where the bottleneck is ... and I'm not sure the line speed has much to do with it; I've done all my tests on an ADSL line (128K/s) ... my guess is rather that the protocol used by the MySQL server is not very well optimized for remote access ...









